Training Autoencoders Using Relative Entropy Constraints

Authors

Abstract

Autoencoders are widely used for dimensionality reduction and feature extraction. The backpropagation algorithm used to train the parameters of an autoencoder suffers from problems such as slow convergence. Therefore, researchers have proposed forward propagation algorithms. However, existing forward propagation algorithms do not consider the characteristics of the data itself. This paper proposes an autoencoder based on relative entropy constraints, called the relative entropy autoencoder (REAE). When solving the feature mapping parameters, REAE imposes different constraints on the average activation value of the hidden layer outputs obtained from different sample sets. In the experimental section, REAE is compared with other autoencoders by applying the extracted features to an image classification task. The results on three datasets show that the performance of the classification system constructed with REAE is better than that of systems built on other autoencoders.
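To make the constraint concrete, here is a minimal sketch in the spirit of the abstract: an autoencoder whose average hidden activation, computed separately per sample set, is pulled toward a set-specific target by a relative entropy (KL) penalty. Everything below is illustrative: the architecture, hyperparameters, and the use of ordinary gradient descent are assumptions (the paper itself advocates a forward propagation solution, which this sketch does not reproduce).

```python
import torch
import torch.nn as nn

class REAESketch(nn.Module):
    """Illustrative autoencoder with a relative-entropy (KL) constraint
    on average hidden activations; not the authors' exact algorithm."""
    def __init__(self, n_in, n_hidden):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, n_hidden), nn.Sigmoid())
        self.decoder = nn.Linear(n_hidden, n_in)

    def forward(self, x):
        h = self.encoder(x)
        return self.decoder(h), h

def kl_activation_penalty(h, rho):
    # Average activation of each hidden unit over this sample set.
    rho_hat = h.mean(dim=0).clamp(1e-6, 1 - 1e-6)
    # KL divergence between Bernoulli(rho) and Bernoulli(rho_hat),
    # summed over hidden units (the classical sparse-autoencoder term).
    return (rho * torch.log(rho / rho_hat)
            + (1 - rho) * torch.log((1 - rho) / (1 - rho_hat))).sum()

model = REAESketch(n_in=784, n_hidden=128)  # sizes are placeholders
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()
beta = 0.1  # penalty weight, illustrative

def train_step(batches_by_set, rho_by_set):
    # Each sample set (e.g., each class) gets its own target activation
    # rho, which is how we read the abstract's "different constraints".
    opt.zero_grad()
    loss = 0.0
    for x, rho in zip(batches_by_set, rho_by_set):
        x_hat, h = model(x)
        loss = loss + mse(x_hat, x) + beta * kl_activation_penalty(h, rho)
    loss.backward()
    opt.step()
    return loss
```

The Bernoulli KL term is the penalty familiar from classical sparse autoencoders; REAE's distinguishing idea, per the abstract, is that the target activation differs across sample sets.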


Similar Articles

Pre-Training CNNs Using Convolutional Autoencoders

Despite convolutional neural networks being the state of the art in almost all computer vision tasks, their training remains a difficult task. Unsupervised representation learning using a convolutional autoencoder can be used to initialize network weights and has been shown to improve test accuracy after training. We reproduce previous results using this approach and successfully apply it to th...
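The two-phase recipe this snippet describes (unsupervised reconstruction, then supervised fine-tuning from the learned weights) can be sketched briefly; the architecture, input size (28x28 grayscale, MNIST-like), and optimizer below are placeholder assumptions, not details from the cited work.

```python
import torch
import torch.nn as nn

# Hypothetical convolutional encoder/decoder pair.
encoder = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
)
decoder = nn.Sequential(
    nn.ConvTranspose2d(32, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.ConvTranspose2d(16, 1, kernel_size=3, padding=1),
)

# Phase 1: train encoder + decoder to reconstruct unlabeled images.
autoencoder = nn.Sequential(encoder, decoder)
opt = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)

def reconstruct_step(x):
    opt.zero_grad()
    loss = nn.functional.mse_loss(autoencoder(x), x)
    loss.backward()
    opt.step()
    return loss

# Phase 2: reuse the trained encoder (its weights carry over) as the
# convolutional stem of a classifier, then fine-tune with labels.
classifier = nn.Sequential(encoder, nn.Flatten(), nn.Linear(32 * 28 * 28, 10))
```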


Maximum Entropy Discrimination Denoising Autoencoders

Deep generative models (DGMs) have brought about a major breakthrough, as well as renewed interest, in generative latent variable models. However, an issue current DGM formulations do not address concerns the data-driven inference of the number of latent features needed to represent the observed data. Traditional linear formulations allow for addressing this issue by resorting to tools from the...


Relative $\alpha$-Entropy Minimizers Subject to Linear Statistical Constraints

We study minimization of a parametric family of relative entropies, termed relative α-entropies (denoted Iα(P,Q)). These arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relati...
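For orientation (an addition of ours, not part of the snippet): the usual relative entropy that Iα generalizes is the Kullback-Leibler divergence, and the standard property in this literature is that Iα recovers it in the limit α → 1:

\[
I(P\|Q) = \sum_{x} P(x)\,\log\frac{P(x)}{Q(x)}, \qquad \lim_{\alpha \to 1} I_\alpha(P,Q) = I(P\|Q).
\]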



A Representation Approach for Relative Entropy Minimization with Expectation Constraints

We consider the general problem of relative entropy minimization and entropy maximization subject to expectation constraints. We show that the solutions can be represented as members of an exponential family subject to weaker conditions than previously shown, and the representation can be simplified further if an appropriate conjugate prior density is used. As a result, the solutions can be fou...
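For reference, the classical exponential-family form this snippet alludes to: minimizing the relative entropy I(P‖Q) subject to expectation constraints E_P[f_i] = c_i yields, under suitable regularity conditions (a textbook result, not this paper's contribution),

\[
P^{*}(x) = \frac{Q(x)\,\exp\!\big(\textstyle\sum_i \lambda_i f_i(x)\big)}{Z(\lambda)},
\qquad
Z(\lambda) = \sum_{x} Q(x)\,\exp\!\Big(\textstyle\sum_i \lambda_i f_i(x)\Big),
\]

with the multipliers λ_i chosen so that the constraints hold.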



Journal

Journal title: Applied Sciences

Year: 2022

ISSN: 2076-3417

DOI: https://doi.org/10.3390/app13010287